Deep learning for sleep stages classification: modified rectified linear unit activation function and modified orthogonal weight initialisation

Authors

Abstract

Each stage of sleep can affect human health, and not getting enough sleep at any stage may lead to disorders like parasomnia, apnea, insomnia, etc. Sleep-related diseases could be diagnosed using a Convolutional Neural Network (CNN) classifier. However, this classifier has not been successfully implemented in sleep stage classification systems, owing to its high complexity and low classification accuracy. The aim of this research is to increase accuracy and reduce the learning time of the classifier. The proposed system uses a modified orthogonal weight initialisation together with the Adam optimisation technique to overcome the gradient saturation problem that occurs with the sigmoid activation function, and it uses the Leaky Rectified Linear Unit (ReLU) instead of sigmoid as the activation function. The resulting model, called Enhanced Sleep Stage Classification (ESSC), is trained and tested on six different sleep stage databases: the University College Dublin database (UCD), the Beth Israel Deaconess Medical Center MIT database (MIT-BIH), the European Data Format (EDF) database, the EDF Extended database, the Montreal Archive of Sleep Studies (MASS), and the Sleep Heart Health Study (SHHS). Our results show that the gradient saturation problem no longer exists. The modified orthogonal weight initialisation with the Adam optimiser helps to reduce noise, which in turn results in faster convergence. The convergence speed of ESSC is increased along with better classification accuracy compared to the state-of-the-art solution.
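
As a concrete illustration of the recipe the abstract describes, here is a minimal PyTorch sketch combining a small 1-D CNN, Leaky ReLU activations, orthogonal weight initialisation, and the Adam optimiser. The architecture, kernel sizes, and learning rate are assumptions for illustration, not the authors' published ESSC configuration, and standard orthogonal initialisation stands in for the paper's modified scheme.

```python
import torch
import torch.nn as nn

class SleepStageCNN(nn.Module):
    """Toy 1-D CNN over single-channel EEG epochs (e.g. 30 s at 100 Hz)."""
    def __init__(self, n_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=50, stride=6),
            nn.LeakyReLU(0.01),   # Leaky ReLU in place of sigmoid: no saturation
            nn.MaxPool1d(8),
            nn.Conv1d(16, 32, kernel_size=8),
            nn.LeakyReLU(0.01),
            nn.AdaptiveAvgPool1d(4),
        )
        self.classifier = nn.Linear(32 * 4, n_classes)
        # Standard orthogonal init stands in for the paper's modified scheme,
        # whose exact formulation is not given in this abstract.
        for m in self.modules():
            if isinstance(m, (nn.Conv1d, nn.Linear)):
                nn.init.orthogonal_(m.weight)
                nn.init.zeros_(m.bias)

    def forward(self, x):         # x: (batch, 1, samples)
        return self.classifier(self.features(x).flatten(1))

model = SleepStageCNN()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(8, 1, 3000)       # batch of dummy 30 s epochs at 100 Hz
loss = nn.CrossEntropyLoss()(model(x), torch.randint(0, 5, (8,)))
loss.backward()
optimiser.step()
```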


Similar articles

Deep Learning with S-Shaped Rectified Linear Activation Units

Rectified linear activation units are important components for state-of-the-art deep convolutional networks. In this paper, we propose a novel S-shaped rectified linear activation unit (SReLU) to learn both convex and non-convex functions, imitating the multiple function forms given by the two fundamental laws, namely the Weber-Fechner law and the Stevens law, in psychophysics and neural scien...
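
A minimal NumPy sketch of the piecewise-linear SReLU form follows: identity between two thresholds, linear with separate slopes outside them. In the paper the four parameters (t_l, a_l, t_r, a_r) are learned jointly with the network; the fixed values here are placeholders for illustration.

```python
import numpy as np

def srelu(x, t_l=-1.0, a_l=0.1, t_r=1.0, a_r=0.5):
    # Three pieces: slope a_l below t_l, identity in between, slope a_r above t_r.
    return np.where(x >= t_r, t_r + a_r * (x - t_r),
                    np.where(x <= t_l, t_l + a_l * (x - t_l), x))

print(srelu(np.array([-3.0, 0.0, 3.0])))   # -> [-1.2  0.   2. ]
```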

Full text

Deep Learning using Rectified Linear Units (ReLU)

We introduce the use of rectified linear units (ReLU) as the classification function in a deep neural network (DNN). Conventionally, ReLU is used as an activation function in DNNs, with Softmax function as their classification function. However, there have been several studies on using a classification function other than Softmax, and this study is an addition to those. We accomplish this by ta...
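
A small sketch of the core idea: apply ReLU at the output layer in place of softmax and predict the class as the argmax of the rectified scores. Training details are omitted, and the logit values below are made up for illustration.

```python
import numpy as np

logits = np.array([[2.3, -0.7, 0.9],
                   [-1.2, 0.4, 3.1]])      # two samples, three classes
scores = np.maximum(logits, 0.0)           # ReLU instead of softmax at the output
pred = scores.argmax(axis=1)               # class prediction -> [0, 2]
```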

Full text

Orthogonal polynomials for modified Gegenbauer weight and corresponding quadratures

In this paper we consider polynomials orthogonal with respect to the linear functional L : P → C, defined by L[p] = ∫₋₁¹ p(x) (1 − x²)^(λ−1/2) exp(iζx) dx, where P is the linear space of all algebraic polynomials, λ > −1/2 and ζ ∈ R. We prove the existence of such polynomials for some pairs of λ and ζ, give some of their properties, and finally give an application to numerical integration of highly...
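
For reference, the same moment functional in display form (a direct transcription of the abstract, nothing added):

```latex
% The linear functional considered in the paper, typeset for clarity.
\mathcal{L}\colon \mathcal{P}\to\mathbb{C}, \qquad
\mathcal{L}[p] = \int_{-1}^{1} p(x)\,(1-x^{2})^{\lambda-1/2}\,e^{i\zeta x}\,dx,
\qquad \lambda > -\tfrac12,\ \zeta\in\mathbb{R}.
```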

Full text

Norm-preserving Orthogonal Permutation Linear Unit Activation Functions (OPLU)

We propose a novel activation function that implements piecewise orthogonal non-linear mappings based on permutations. It is straightforward to implement, very computationally efficient, and has low memory requirements. We tested it on two toy problems for feedforward and recurrent networks, where it shows performance similar to tanh and ReLU. The OPLU activation function ensures norm preserva...
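
A minimal NumPy sketch of the permutation idea: units are processed in pairs and each pair is sorted (max first), which is a permutation of the activations and therefore preserves the vector norm. The adjacent-units pairing below is an assumption for illustration.

```python
import numpy as np

def oplu(x):
    """x: (..., 2k) pre-activations; each pair (x[2i], x[2i+1]) is sorted."""
    a, b = x[..., 0::2], x[..., 1::2]
    out = np.empty_like(x)
    out[..., 0::2] = np.maximum(a, b)   # larger element of each pair first
    out[..., 1::2] = np.minimum(a, b)   # smaller element second
    return out

v = np.array([3.0, -1.0, 0.5, 2.0])
assert np.isclose(np.linalg.norm(oplu(v)), np.linalg.norm(v))  # norm preserved
```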

Full text

Modified homotopy perturbation method for solving non-linear oscillator's equations

In this paper a new form of the homotopy perturbation method is used for solving oscillator differential equations, which yields the Maclaurin series of the exact solution. Nonlinear vibration problems and differential equation oscillations have crucial importance in all areas of science and engineering. These equations provide a significant mathematical model for dynamical systems. The accuracy o...

Full text


Journal

Journal title: Multimedia Tools and Applications

Year: 2022

ISSN: 1380-7501, 1573-7721

DOI: https://doi.org/10.1007/s11042-022-12372-7